ECE 6605 - Information Theory

Prof. Matthieu Bloch

Thursday, October 19, 2023

Announcements

  • Assignment 2 and Midterm
    • Grading still ongoing, goal is to give you your grades back before drop date (Friday October 27, 2023)
    • Bonus assignment to boost midterm grades?
  • Assignment 3
    • Coming up
  • Last time
    • Privacy amplification
  • Today
    • Channel coding

Channel coding

[Figure: Channel coding model]
  • Unless otherwise specified, we assume for now that the message is uniformly distributed

An \((n,M_n)\) code \(\calC\) for channel coding over a discrete memoryless channel \(P_{Y|X}\) consists of an encoding function \(f_n:\{1,\cdots,M_n\}\to\calX^n\) and a decoding function \(g_n:\calY^n\to\{1,\cdots,M_n\}\).

  • The performance of an \((n,M_n)\) code is measured in terms of
    1. The rate of the code \(R\eqdef \frac{\log_2 M_n}{n}\) (bits/channel use)
    2. The average probability of decoding error \[ P_e^{(n)}\eqdef \P{\widehat{M}\neq M} \eqdef \sum_{m=1}^{M_n}\frac{1}{M_n}\sum_{y^n}P_{Y|X}(y^n|f_n(m))\indic{g_n(y^n)\neq m} \]
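
To make these two metrics concrete, here is a minimal sketch (an illustration, not taken from the lecture) that computes the rate and the exact average error probability of a length-3 repetition code with majority-vote decoding over a BSC with crossover probability 0.1; the code, channel, and decoder are all assumptions chosen for simplicity.

```python
import numpy as np

# Assumed toy setup: messages {0, 1} mapped to codewords 000 and 111,
# sent over a BSC(0.1), decoded by majority vote.
p = 0.1
M, n = 2, 3
rate = np.log2(M) / n  # R = log2(M_n) / n bits per channel use

codewords = np.array([[0, 0, 0], [1, 1, 1]])

# Exhaustively sum P_{Y|X}(y^n | f_n(m)) over all y^n with g_n(y^n) != m.
P_e = 0.0
for m in range(M):
    for y_idx in range(2 ** n):
        y = np.array([(y_idx >> k) & 1 for k in range(n)])
        flips = int(np.sum(y != codewords[m]))
        prob = p ** flips * (1 - p) ** (n - flips)
        m_hat = int(np.sum(y) >= 2)  # majority-vote decoder g_n
        if m_hat != m:
            P_e += prob / M          # uniform prior 1/M_n on messages
print(rate, P_e)  # R = 1/3, P_e = 3 p^2 (1-p) + p^3 = 0.028
```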

Channel capacity

  • Define again \[ C\eqdef \sup\left\{R: \exists \set{(f_n,g_n)}_{n\geq 1} \textsf{ with }\liminf_{n\to\infty}\frac{\log M_n}{n}\geq R \text{ and } \lim_{n\to\infty}P_e^{(n)}=0\right\} \]

For a discrete memoryless channel characterized by \(P_{Y|X}\), \(C = \max_{P_X}\mathbb{I}(X;Y)\)

  • Remarks
    • This result is called the channel coding theorem
    • Given a statistical characterization of a channel, there is a limit to how fast we can send data
  • Proof of the result
    • We will again use an achievability argument and a converse
    • We will first develop some intuition about codes
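
The maximization over \(P_X\) can also be carried out numerically. Below is a sketch of the Blahut-Arimoto alternating-maximization algorithm (not part of the lecture; shown as one standard way to evaluate the formula), checked on a BSC with crossover probability 0.1, where the answer should match \(1-H_b(0.1)\approx 0.531\) bits.

```python
import numpy as np

def blahut_arimoto(W, n_iter=200):
    """Estimate C = max_{P_X} I(X;Y) for a DMC with transition matrix W,
    where W[x, y] = P_{Y|X}(y|x), by alternating maximization."""
    nx, _ = W.shape
    p = np.full(nx, 1.0 / nx)      # start from the uniform input distribution
    for _ in range(n_iter):
        q = p @ W                  # induced output distribution P_Y
        # D[x] = D(P_{Y|X=x} || P_Y), a Kullback-Leibler divergence
        D = np.array([np.sum(W[x][W[x] > 0]
                             * np.log(W[x][W[x] > 0] / q[W[x] > 0]))
                      for x in range(nx)])
        p = p * np.exp(D)
        p /= p.sum()               # multiplicative update of P_X
    q = p @ W
    return sum(p[x] * np.sum(W[x][W[x] > 0]
                             * np.log2(W[x][W[x] > 0] / q[W[x] > 0]))
               for x in range(nx))

# Sanity check on a BSC(0.1): expect C = 1 - H_b(0.1) ≈ 0.531 bits
W = np.array([[0.9, 0.1], [0.1, 0.9]])
print(blahut_arimoto(W))
```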

Building up some intuition

  • Binary symmetric channel model
    • \(\calX=\calY=\set{0,1}\) and \(Y=X\oplus N\) where \(N\sim B(p)\), \(p\in[0,1]\) ("bit flip") \[ P_{Y|X}(y|x) = p^{x\oplus y} (1-p)^{1-(x\oplus y)} \]
    • \(C = 1-H_b(p)\) where \(H_b\) is the binary entropy function
  • Binary erasure channel model
    • \(\calX=\set{0,1}\), \(\calY=\set{0,1,?}\), \(\epsilon\in[0,1]\) \[ P_{Y|X}(y|x) = (1-\epsilon)\indic{y=x} + \epsilon\indic{y=?} = \begin{cases}1-\epsilon\text{ if }y=x\\\epsilon\text{ if } y=?\\0\text{ else}\end{cases} \]
      • \(C = 1-\epsilon\)
  • Repetition coding for the erasure channel
    • How can we protect against errors? See the sketch below.
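
As a quick illustration of repetition coding, the Monte Carlo sketch below (the parameters \(\epsilon=0.5\) and \(n=5\) are assumptions) sends each message bit \(n\) times over the BEC; decoding fails only when all \(n\) copies are erased, in which case the decoder guesses.

```python
import numpy as np

rng = np.random.default_rng(0)
eps, n, trials = 0.5, 5, 100_000  # erasure prob., repetitions, Monte Carlo runs

bits = rng.integers(0, 2, size=trials)       # message bits
erased = rng.random((trials, n)) < eps       # erasure pattern per transmission
all_erased = erased.all(axis=1)              # only failure mode for repetition
guesses = rng.integers(0, 2, size=trials)    # coin flip when all copies erased
decoded = np.where(all_erased, guesses, bits)

print("rate:", 1 / n)                        # R = 1/n, far below C = 1 - eps
print("P_e (empirical):", np.mean(decoded != bits))
print("P_e (exact):", eps ** n / 2)          # all erased, then a wrong guess
```

Repetition thus trades rate for reliability: \(P_e^{(n)}=\epsilon^n/2\) vanishes, but the rate \(1/n\) tends to 0, while the channel coding theorem promises vanishing error at any fixed rate below \(C=1-\epsilon\).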

Achievability (random coding)

  • Consider a generic channel \((\calU,P_{V|U},\calV)\) with message set \(\{1,\cdots,M\}\)

  • For \(\gamma>0\), let

    \[\begin{align*} \calA_\gamma \eqdef \left\{(u,v)\in\calU\times\calV:\log\frac{P_{V|U}(v|u)}{P_V(v)}\geq\gamma\right\} \end{align*}\]

  • Encoder

    • For each \({m}\in\{1,\cdots,M\}\), \(f(m)\) is a codeword \(u\) drawn independently according to \(P_U\).
  • Decoder: \(g\) is defined as follows. Upon receiving a channel output \(v\):

    • If there exists a unique \(\hat{m}\) such that \((f(\hat{m}),v)\in\calA_\gamma\), return \(\hat{m}\);
    • Else, return a pre-defined arbitrary \(m_0\)

For any \(\gamma>0\),

\[\begin{align*} \E[C]{P_e(C)} \leq \P[P_UP_{V|U}]{(U,V)\notin \calA_\gamma} + M2^{-\gamma}. \end{align*}\]
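
The first term bounds the probability that the transmitted codeword itself fails the threshold test; the second is a union bound over the \(M-1\) incorrect codewords, each of which passes the test with probability at most \(2^{-\gamma}\) because it is drawn independently of \(V\). The sketch below (assumptions: the generic channel is \(n\) i.i.d. uses of a BSC(0.1), \(P_U\) is uniform on \(\{0,1\}^n\), and logs are in bits) estimates both terms of the bound at a rate below capacity.

```python
import numpy as np

rng = np.random.default_rng(1)

# With P_U uniform, P_V is also uniform and the information density
# tensorizes: log2[P_{V|U}(v|u)/P_V(v)] = n + k log2(p) + (n-k) log2(1-p),
# where k is the number of bit flips.
p, n = 0.1, 1000
C = 1 + p * np.log2(p) + (1 - p) * np.log2(1 - p)  # 1 - H_b(p) per use
R = 0.4                                            # target rate, below C ≈ 0.531
M = 2 ** int(n * R)
gamma = n * (R + C) / 2            # threshold halfway between n*R and n*C

trials = 100_000
k = rng.binomial(n, p, size=trials)                # flips under P_U P_{V|U}
i_density = n + k * np.log2(p) + (n - k) * np.log2(1 - p)
P_miss = np.mean(i_density < gamma)                # P[(U,V) not in A_gamma]

print("P[(U,V) not in A_gamma] ≈", P_miss)
print("M 2^{-gamma}            =", M * 2.0 ** (-gamma))
print("bound on E[P_e]         ≈", P_miss + M * 2.0 ** (-gamma))
```

Both terms vanish as \(n\) grows for any \(R<C\): the first by the law of large numbers (the information density concentrates around \(nC\)), the second because \(\gamma\) sits strictly above \(nR=\log_2 M\).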

Converse

Consider a sequence of codes \(\set{(f_n,g_n)}_{n\geq 1}\) such that \(\lim_{n\to\infty}P_e^{(n)}=0\) and \(\liminf_{n\to\infty}\frac{\log M_n}{n}\geq R\). Then \(R\leq \max_{P_X}\mathbb{I}(X;Y)\).

  • The proof relies again on Fano's inequality!
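
For completeness, here is a sketch of the standard chain of inequalities behind the converse (with \(M\) uniform, so \(\mathbb{H}(M)=\log M_n\)):

\[\begin{align*} \log M_n = \mathbb{H}(M) &= \mathbb{I}(M;Y^n) + \mathbb{H}(M|Y^n)\\ &\leq \mathbb{I}(X^n;Y^n) + 1 + P_e^{(n)}\log M_n && \text{(data processing and Fano)}\\ &\leq \sum_{i=1}^{n}\mathbb{I}(X_i;Y_i) + 1 + P_e^{(n)}\log M_n && \text{(memoryless channel)}\\ &\leq n\max_{P_X}\mathbb{I}(X;Y) + 1 + P_e^{(n)}\log M_n. \end{align*}\]

Dividing by \(n\) and letting \(n\to\infty\) with \(P_e^{(n)}\to 0\) yields \(R\leq \max_{P_X}\mathbb{I}(X;Y)\).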